
    An Exploration of the Finance-Growth Nexus: Long Run and Causality Evidences from Selected Countries of SAARC Region

    The direction of Granger causality between financial development and real-sector growth has been a growing debate since the 1980s, and researchers have been keen to empirically discern the long-run and causal relationship in order to devise economic policies. This study empirically investigated the finance-growth nexus and causality in selected countries of the SAARC region (Pakistan, India, Nepal and Sri Lanka) using a yearly data set covering 1975-2009. The study employed banking-sector variables as proxies for financial development. Results of the Maddala & Wu and Kao co-integration tests confirm that a long-run relationship exists between the financial and real-sector variables. The causality results show that causality runs from real-sector growth to financial-sector development through the proxies of the ratio of liquid liabilities to GDP per capita, the ratio of private credit by deposit money banks and financial institutions to GDP per capita, the ratio of bank deposits to GDP per capita, and the ratio of commercial bank assets to the sum of commercial bank plus central bank assets in the SAARC region. Keywords: Financial Sector Development, Real Sector Growth, Panel Co-Integration, VECM, Granger Causality, SAARC region
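
    A minimal sketch of the kind of pairwise Granger causality test the study reports, written in Python with statsmodels. The two series below are synthetic stand-ins for one country's real-sector growth and a banking-depth ratio; the panel co-integration (Maddala & Wu, Kao) and VECM steps are not reproduced, and all variable names are illustrative assumptions.

# Hedged sketch: pairwise Granger causality on synthetic series standing in for
# one country's real-sector growth and a banking-depth ratio. Not the paper's
# panel methodology; data and parameters are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 35  # roughly the length of the 1975-2009 annual sample

# Let the banking-depth ratio follow real growth with a one-year lag, so
# causality should run from growth to finance, as the abstract reports.
growth = rng.normal(0.04, 0.02, n)
liq_liab = 0.5 * np.concatenate(([0.0], growth[:-1])) + rng.normal(0, 0.005, n)

df = pd.DataFrame({"liq_liab": liq_liab, "growth": growth}).iloc[1:]

# Null hypothesis: the second column does NOT Granger-cause the first.
results = grangercausalitytests(df[["liq_liab", "growth"]], maxlag=2)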

    Evaluating Effect of Block Size in Compressed Sensing for Grayscale Images

    Compressed sensing is an evolving methodology that enables sampling at sub-Nyquist rates while still providing decent signal reconstruction. During the last decade, reported works have suggested improving time efficiency by adopting Block-based Compressed Sensing (BCS) and improving reconstruction performance through new algorithms. A trade-off is required between time efficiency and reconstruction quality. In this paper, we have evaluated the significance of block size in BCS for improving reconstruction performance for grayscale images. A parameter variant of BCS [15] based sampling, followed by reconstruction through the Smoothed Projected Landweber (SPL) technique [16] involving a Wiener smoothing filter and iterative hard thresholding, is applied in this paper. The BCS variant is used to evaluate the effect of block size on image reconstruction quality by carrying out extensive testing on 9200 images acquired from online resources provided by Caltech101 [6], the University of Granada [7] and Florida State University [8]. The experimentation showed some consistent results which can improve reconstruction performance in all BCS frameworks, including BCS-SPL [17] and its variants [19], [27]. Firstly, varying the block size (4x4, 8x8, 16x16, 32x32 and 64x64) changes the Peak Signal to Noise Ratio (PSNR) of reconstructed images by at least 1 dB and by as much as 16 dB. This challenges the common notion that bigger block sizes always result in better reconstruction performance. Secondly, the variation in reconstruction quality with changing block size depends mostly on the visual contents of the image. Thirdly, images having similar visual contents, irrespective of size, e.g., those from the same category of Caltech101 [6], gave a majority vote for the same Optimum Block Size (OBS). These focused observations may help improve BCS-based image capturing in many existing applications. For example, experimental results suggest using a block size of 8x8 or 16x16 to capture facial identity using BCS. Fourthly, the average processing time taken for BCS and reconstruction through SPL, with the lapped transform of the Discrete Cosine Transform as the sparsifying basis, ranged from 300 milliseconds for a block size of 4x4 to 5 seconds for a block size of 64x64. Since the processing-time variation remains less than 5 seconds, selecting the OBS may not affect the time constraint in many applications. Analysis reveals that no particular block size is able to provide optimum reconstruction for all images with varying visual contents. Therefore, the selection of block size should be made specific to the particular type of application images, depending on their visual contents.
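
    A minimal sketch of block-based CS measurement and a naive reconstruction for a grayscale image, assuming a random Gaussian measurement matrix and a minimum-norm (pseudo-inverse) recovery in place of the paper's SPL reconstruction (Wiener filtering plus iterative hard thresholding). The block_size and subrate parameters are the knobs the study varies; the gradient test image is a placeholder for the Caltech101 data.

# Hedged sketch: block-based compressed sensing (BCS) of a grayscale image with
# a random Gaussian matrix, followed by a naive pseudo-inverse reconstruction
# and PSNR. Not the paper's SPL algorithm; all parameters are illustrative.
import numpy as np

def bcs_measure_reconstruct(img, block_size=8, subrate=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = block_size * block_size
    m = max(1, int(round(subrate * n)))
    phi = rng.normal(size=(m, n)) / np.sqrt(m)    # same matrix for every block
    phi_pinv = np.linalg.pinv(phi)
    h, w = img.shape
    recon = np.zeros((h, w), dtype=float)
    for i in range(0, h - block_size + 1, block_size):
        for j in range(0, w - block_size + 1, block_size):
            x = img[i:i + block_size, j:j + block_size].astype(float).ravel()
            y = phi @ x                            # compressive measurements
            recon[i:i + block_size, j:j + block_size] = (phi_pinv @ y).reshape(block_size, block_size)
    return recon

def psnr(ref, est, peak=255.0):
    mse = np.mean((ref.astype(float) - est) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# Usage with a synthetic 64x64 gradient image (placeholder for real test data).
img = np.tile(np.arange(64, dtype=float) * 4, (64, 1))
for bs in (4, 8, 16, 32):
    print(bs, round(psnr(img, bcs_measure_reconstruct(img, block_size=bs)), 2))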

    Landmark Based Audio Fingerprinting for Naval Vessels

    This paper presents a novel landmark-based audio fingerprinting algorithm for matching naval vessels' acoustic signatures. The algorithm incorporates a joint time-frequency approach with parameters optimized for application to the acoustic signatures of naval vessels. The technique exploits the relative time difference between neighboring frequency onsets, which is found to remain consistent across different samples originating over time from the same vessel. The algorithm has been implemented in MATLAB and trialed with real acoustic signatures of submarines. The training and test samples of submarines were acquired from resources provided by the San Francisco National Park Association [14]. Storage requirements to populate the database with 500 tracks, allowing a maximum of 0.5 million feature hashes per track, remained below 1 GB. On an average PC, the database hash table can be populated with feature hashes of database tracks at 1250 hashes/second, achieving conversion of 120 seconds of audio data into hashes in less than a second. Under varying attributes such as time skew, noise and sample length, the results demonstrate the algorithm's robustness in identifying a correct match. Experimental results show a classification rate of 94% using the proposed approach, a considerable improvement over the 88% achieved by [17] employing existing state-of-the-art techniques such as Detection of Envelope Modulation on Noise (DEMON) [15] and Low Frequency Analysis and Recording (LOFAR) [16].
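
    A minimal Python sketch of landmark-style fingerprint hashing in the spirit described above: spectrogram peaks are paired within a small target zone and hashed on their two frequencies and time offset. The peak-picking rule, fan-out, zone size and synthetic two-tone test signal are illustrative assumptions, not the paper's tuned parameters or its MATLAB implementation.

# Hedged sketch: landmark hashes built from pairs of spectral peaks, keyed on
# (f1, f2, dt). Parameters are illustrative, not the paper's settings.
import numpy as np
from scipy.signal import spectrogram
from scipy.ndimage import maximum_filter

def landmark_hashes(audio, fs, fan_out=5, max_dt=64):
    f, t, sxx = spectrogram(audio, fs, nperseg=512, noverlap=256)
    logs = np.log1p(sxx)
    # Peaks: local maxima that stand above the mean level.
    peaks = (logs == maximum_filter(logs, size=(9, 9))) & (logs > logs.mean())
    fi, ti = np.nonzero(peaks)
    order = np.argsort(ti)
    fi, ti = fi[order], ti[order]
    hashes = []
    for a in range(len(ti)):
        paired = 0
        for b in range(a + 1, len(ti)):
            dt = ti[b] - ti[a]
            if dt > max_dt or paired >= fan_out:
                break
            if dt > 0:
                hashes.append(((int(fi[a]), int(fi[b]), int(dt)), int(ti[a])))
                paired += 1
    return hashes  # list of ((f1, f2, dt), anchor_time) landmark hashes

# Usage with a synthetic two-tone signal standing in for a vessel recording.
fs = 8000
t = np.arange(fs * 2) / fs
sig = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 900 * t)
print(len(landmark_hashes(sig, fs)), "hashes")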

    Novel DEMON Spectra Analysis Techniques and Empirical Knowledge Based Reference Criterion for Acoustic Signal Classification

    This paper presents some novel methods to estimate a vessel's number of shafts, course, and speed, and to classify it using the underwater acoustic noise it generates. A classification framework, as well as a set of reference parameters for comparison, is put forth. Identifying marine traffic in the surroundings is an important task for vessels in the open sea. Vessels in the vicinity can be identified using their signatures; one of the typical signatures emitted by a vessel is its acoustic signature. The raw sonar data containing the acoustic signatures is generally observed manually by sonar operators to suggest the class of the query vessel. The valuable information that can be extracted from the recorded acoustic signature includes shaft revolutions per minute (SRPM), number of blades (NOB), number of shafts, course, and speed. Expert sonar operators use their empirical knowledge to estimate a vessel's SRPM and NOB, and based on this information vessel classification is performed. Empirical knowledge comes with experience, and the manual process is prone to human error. To make the process systematic, the parameters of the received acoustic samples can be calculated and visually analyzed using Detection of Envelope Modulation on Noise (DEMON) spectra. Reported research mostly focuses on SRPM and NOB, yet parameters such as number of shafts and vessel course and speed can effectively aid the vessel classification process. This paper makes three novel contributions in this area. Firstly, some novel DEMON spectra analysis techniques are proposed to estimate a water vessel's number of shafts, speed, and relative course. Secondly, this paper presents a classification framework that uses the features extracted from DEMON spectra and compares them with a reference set. Thirdly, a novel set of reference parameters is provided that aids classification into the categories of large merchant ship type 1, large merchant ship type 2, large merchant ship type 3, medium merchant ship, oiler, car carrier, cruise ship, fishing boat and fishing trawler. The proposed analysis and classification techniques were assessed through trials with 877 real acoustic signatures recorded under varying conditions of ship speed and sea state. The classification trials revealed a high accuracy of 94.7%.
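
    A minimal sketch of a basic DEMON-style processing chain of the kind referred to above, assuming a simple pipeline of band-pass filtering, Hilbert envelope extraction and an FFT of the envelope to expose shaft-rate modulation lines. The band edges, the synthetic amplitude-modulated noise and the RPM read-out are illustrative assumptions, not the paper's estimation techniques or reference criteria.

# Hedged sketch of a generic DEMON chain: band-pass the broadband cavitation
# noise, take the Hilbert envelope, then inspect the envelope spectrum for
# shaft-rate / blade-rate lines. Parameters are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def demon_spectrum(x, fs, band=(1000.0, 4000.0)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, x)))
    envelope -= envelope.mean()
    spec = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(len(envelope), 1.0 / fs)
    return freqs, spec

# Usage: synthetic noise amplitude-modulated at 5 Hz (a 300 RPM shaft rate).
fs = 16000
t = np.arange(fs * 4) / fs
noise = np.random.default_rng(1).normal(size=t.size)
x = (1 + 0.5 * np.cos(2 * np.pi * 5.0 * t)) * noise
freqs, spec = demon_spectrum(x, fs)
low = (freqs > 0.5) & (freqs < 30)          # look only at modulation lines
peak_hz = freqs[low][np.argmax(spec[low])]
print("dominant modulation:", peak_hz, "Hz  ->", 60 * peak_hz, "RPM shaft rate")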

    Principals' Perception Regarding Factors Affecting the Performance of Teachers

    This study investigated the perception of principals regarding how the factors of subject mastery, teaching methodology, personal characteristics, and attitude toward students affect the performance of teachers at the higher secondary level in the Punjab. All principals at the higher secondary level in the Punjab formed the population of the study, from which 120 principals were selected as the sample. A questionnaire was developed and validated through pilot testing. The data obtained were tabulated and analyzed using the statistical techniques of mean and standard deviation. The major conclusions of the study were that the factor of subject mastery was perceived to be influencing the performance of teachers maximally, whereas the factor of attitude toward students was perceived to be affecting the performance of teachers minimally; the remaining two factors, teaching methodology and personal characteristics, were perceived to be at an intermediate level.

    A Comprehensive Review of Vehicle Detection Techniques Under Varying Moving Cast Shadow Conditions Using Computer Vision and Deep Learning

    The design of a vision-based traffic analytics system for urban traffic video scenes has great potential in the context of Intelligent Transportation Systems (ITS). It offers useful traffic-related insights at much lower cost compared to conventional sensor-based counterparts. However, it remains a challenging problem due to complexity factors such as camera hardware constraints, camera movement, object occlusion, object speed, object resolution, traffic flow density, and lighting conditions. ITS has many applications, including but not limited to queue estimation, speed detection and the detection of various anomalies. All of these applications primarily depend on sensing vehicle presence to form a basis for analysis. Moving cast shadows of vehicles are one of the major problems affecting vehicle detection, as they can cause detection and tracking inaccuracies. Therefore, it is exceedingly important to distinguish dynamic objects from their moving cast shadows for accurate vehicle detection and recognition. This paper provides an in-depth comparative analysis of different traffic-paradigm-focused conventional and state-of-the-art shadow detection and removal algorithms. To date, there has been only one survey that highlights shadow removal methodologies particularly for the traffic paradigm. In this paper, a total of 70 research papers containing results on urban traffic scenes have been shortlisted from the last three decades to give a comprehensive overview of the work done in this area. The study reveals that the preferable way to make a comparative evaluation is to use the existing Highway I, II, and III datasets, which are frequently used for qualitative or quantitative analysis of shadow detection or removal algorithms. Furthermore, the paper not only provides cues to solve moving cast shadow problems, but also shows that even after the advent of Convolutional Neural Network (CNN)-based vehicle detection methods, the problems caused by moving cast shadows persist. Therefore, this paper proposes a hybrid approach which uses a combination of conventional and state-of-the-art techniques as a pre-processing step for shadow detection and removal before using a CNN for vehicle detection. The results indicate a significant improvement in vehicle detection accuracy after using the proposed approach.
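
    A minimal sketch of one conventional pre-processing step of the kind the review surveys, assuming OpenCV's MOG2 background subtractor with its built-in shadow labelling (shadow pixels are marked 127 in the foreground mask) to suppress moving cast shadows before any downstream CNN-based vehicle detector. The video path, history length and morphology kernel are illustrative assumptions, not the paper's proposed hybrid method.

# Hedged sketch: drop MOG2-labelled shadow pixels (value 127) from the
# foreground mask so that only confident vehicle pixels (value 255) reach a
# downstream detector. Path and parameters are illustrative.
import cv2
import numpy as np

def foreground_without_shadows(video_path="traffic.mp4", max_frames=100):
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=True)
    masks = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        vehicles_only = np.where(mask == 255, 255, 0).astype(np.uint8)
        vehicles_only = cv2.morphologyEx(vehicles_only, cv2.MORPH_OPEN,
                                         np.ones((3, 3), np.uint8))
        masks.append(vehicles_only)
    cap.release()
    return masks  # binary masks a CNN-based vehicle detector could be gated with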

    On the Efficiency of Software Implementations of Lightweight Block Ciphers from the Perspective of Programming Languages

    Lightweight block ciphers are primarily designed for resource-constrained devices. However, due to the service requirements of large-scale IoT networks and systems, the need for efficient software implementations cannot be ruled out. A number of studies have compared software implementations of different lightweight block ciphers on a specific platform, but to the best of our knowledge, this is the first attempt to benchmark various software implementations of a single lightweight block cipher across different programming languages and platforms in a cloud architecture. In this paper, we defined six lookup-table-based software implementations for lightweight block ciphers, with characteristics ranging from memory-optimized to throughput-optimized variants. We carried out a thorough analysis of the two costs associated with each implementation (memory and operations) and discussed possible trade-offs in detail. We coded all six types of implementations for three key settings (64, 80, 128 bits) of LED (a lightweight block cipher) in four programming languages (Java, C#, C++, Python). We highlighted the impact of the choice of implementation type, programming language, and platform by benchmarking the seventy-two implementations for throughput and software efficiency on 32- and 64-bit platforms for two major operating systems (Windows and Linux) on the Amazon Web Services cloud. The results showed that these choices can affect the efficiency of a cryptographic primitive by a factor as high as 400.
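
    A minimal Python sketch of the lookup-table trade-off and a throughput harness of the kind such benchmarking relies on. A 4-bit S-box layer (LED reuses the PRESENT 4-bit S-box) is applied either nibble by nibble or through a precomputed 256-entry byte table, and both variants are timed; the state layout, iteration count and harness are illustrative assumptions, not one of the paper's six implementation types.

# Hedged sketch: memory vs. operations trade-off for a table-based S-box layer,
# plus a simple throughput measurement. Not the paper's LED implementations.
import time

SBOX4 = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD, 0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
# Larger table, fewer operations: substitute a whole byte with one lookup.
SBOX_BYTE = [(SBOX4[b >> 4] << 4) | SBOX4[b & 0xF] for b in range(256)]

def sub_nibbles(state):            # smaller table, more operations per byte
    return bytes((SBOX4[b >> 4] << 4) | SBOX4[b & 0xF] for b in state)

def sub_bytes_table(state):        # bigger table, one lookup per byte
    return bytes(SBOX_BYTE[b] for b in state)

def throughput(fn, block=bytes(range(8)), iters=200_000):
    start = time.perf_counter()
    for _ in range(iters):
        fn(block)
    elapsed = time.perf_counter() - start
    return iters * len(block) / elapsed / 1e6   # MB/s of state processed

print("nibble-wise :", round(throughput(sub_nibbles), 2), "MB/s")
print("byte table  :", round(throughput(sub_bytes_table), 2), "MB/s")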

    Analysis of codon usage bias of lumpy skin disease virus causing livestock infection

    Lumpy skin disease virus (LSDV), a double-stranded DNA virus belonging to the genus Capripoxvirus of the family Poxviridae, causes lumpy skin disease (LSD) in livestock. LSDV is an important poxvirus that has spread far and wide and is now distributed worldwide. It poses serious health risks to the host, has a considerable negative socioeconomic impact on farmers financially, and affects cattle by causing ruminant-related disease. Previous studies have explained the population structure of LSDV within the evolutionary time scale and its adaptive evolution. However, how synonymous codons are used by LSDV remains unknown. Here, we applied codon usage bias (CUB) analysis to 53 LSDV strains. Both the base content and the relative synonymous codon usage (RSCU) analysis revealed that AT-ended codons were more frequently used in the genome of LSDV. Furthermore, low codon usage bias was indicated by the effective number of codons (ENC) value. The neutrality plot analysis suggested that natural selection was the dominant factor shaping CUB in LSDV. Additionally, the results from a comparative analysis suggested that LSDV has adapted host-specific codon usage patterns to sustain successful replication and transmission chains within its hosts (Bos taurus and Homo sapiens). Both natural selection and mutational pressure have an impact on the codon usage patterns of the protein-coding genes in LSDV. This study is important because it characterizes the codon usage pattern in LSDV genomes and provides the necessary data for basic evolutionary studies on them.
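
    A minimal sketch of the RSCU computation referred to above, assuming Biopython's standard codon table to group synonymous codons; the short input sequence is a placeholder rather than an LSDV gene, and the ENC and neutrality-plot steps of the study are not reproduced.

# Hedged sketch: relative synonymous codon usage (RSCU) for one coding sequence.
# RSCU = observed codon count / expected count under equal synonymous usage.
from collections import Counter, defaultdict
from Bio.Data import CodonTable

def rscu(cds):
    table = CodonTable.unambiguous_dna_by_name["Standard"].forward_table
    counts = Counter(cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3))
    by_aa = defaultdict(list)
    for codon, aa in table.items():             # stop codons are excluded
        by_aa[aa].append(codon)
    values = {}
    for aa, codons in by_aa.items():
        total = sum(counts[c] for c in codons)
        if total == 0:
            continue
        expected = total / len(codons)          # equal use of synonymous codons
        for c in codons:
            values[c] = counts[c] / expected
    return values

# Usage with a short placeholder CDS (not an LSDV gene).
print(rscu("ATGGCTGCCGCAGCGAAAAAG"))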

    الإتجاھات الإسلاميۃ في أدب الأطفال عند کامل الکیلاني: Islamic Trends in Children's Literature According to Kamel Al-Kilani

    Writers and critics have differed about the real beginnings of children's literature. One opinion holds that it is as old as humanity's presence on earth, linking its appearance to the stories once told by mothers and grandmothers to their children, stories that mostly stemmed from ancient myths and legends yet were full of wisdom and encouragement toward goodness. Ismail Abdel-Fattah, for example, believes that children's literature has existed for as long as morals have, as mothers used to tell their children bedtime stories and soothe them to sleep with melodies and beautiful words; it was wonderful literature, but it was not written down. This situation continued until such literature became recognized at the end of the last century. Children's literature existed in the past, but it was oral, unwritten literature, delivered to young children in the form of stories and bedtime tales by mothers. It is perhaps older than other literary genres because it coincides with the emergence of language itself and its association with forms of expression of human life; simple expression is what represents innate life and depicts human affection, the affection for things, and parental affection toward the child, in clear terms. Kamel Al-Kilani is considered the legitimate father of children's literature in the Arabic language, and the leader of the school of writers for young people in Arabic as a whole. He wrote about a thousand stories and dramas for children, whether translated, adapted or original, and he wrote dozens of pieces, poems, songs and chants that accompany children from infancy until they reach the age of youth. He wrote many stories and plays, in addition to pieces and poems for children.

    Systems biology based meth-miRNA–mRNA regulatory network identifies metabolic imbalance and hyperactive cell cycle signaling involved in hepatocellular carcinoma onset and progression

    Background: Hepatocellular carcinoma (HCC) is one of the leading causes of cancer-associated deaths worldwide. Independent studies have proposed altered DNA methylation patterns and aberrant microRNA (miRNA) levels, leading to abnormal expression of different genes, as important regulators of disease onset and progression in HCC. Here, using systems biology approaches, we aimed to integrate methylation, miRNA profiling and gene expression data into a regulatory methylation-miRNA–mRNA (meth-miRNA–mRNA) network to better understand the onset and progression of the disease. Methods: Patients' gene methylation, miRNA expression and gene expression data were retrieved from the NCBI GEO and TCGA databases. Differentially methylated genes and differentially expressed miRNAs and genes were identified by comparing the respective patients' data using a two-tailed Student's t-test. Functional annotation and pathway enrichment, miRNA–mRNA inverse pairing, and gene set enrichment analyses (GSEA) were performed using the DAVID, miRDIP v4.1 and GSEA tools, respectively. The meth-miRNA–mRNA network was constructed using Cytoscape v3.5.1. Kaplan–Meier survival analyses were performed using an R script, and significance was calculated by the log-rank (Mantel-Cox) test. Results: We identified differentially expressed mRNAs and miRNAs and differentially methylated genes in HCC compared with normal adjacent tissues by analyzing gene expression, miRNA expression and methylation profiling data of HCC patients, and we integrated the top miRNAs, along with their mRNA targets and their methylation profiles, into a regulatory meth-miRNA–mRNA network using a systems biology approach. Pathway enrichment analyses of the identified genes revealed suppressed metabolic pathways and hyperactive cell cycle signaling as key features of HCC onset and progression, which we validated in 10 different HCC patient datasets. Next, we confirmed the inverse correlation between gene methylation and gene expression, and between miRNA and target expression, in various datasets. Furthermore, we validated the clinical significance of the identified methylation, miRNA and mRNA signatures by checking their association with clinical features and survival of HCC patients. Conclusions: Overall, we suggest that the simultaneous (1) reversal of hyper-methylation and/or oncogenic-miRNA-driven suppression of genes involved in metabolic pathways, and (2) induction of hyper-methylation and/or tumor-suppressor-miRNA-driven suppression of genes involved in cell cycle signaling, have the potential to inhibit disease aggressiveness and to predict good survival in HCC.
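
    A minimal sketch of the differential-expression step described in the Methods, assuming a per-gene two-tailed Student's t-test between tumour and adjacent-normal samples with Benjamini-Hochberg correction. The expression matrix is synthetic, and the GEO/TCGA retrieval, miRDIP pairing, GSEA and Cytoscape network construction are not reproduced here.

# Hedged sketch: per-gene two-tailed t-test (tumour vs. normal) with FDR
# correction on a synthetic expression matrix standing in for patient data.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
genes = 500
tumour = rng.normal(0.0, 1.0, size=(genes, 30))
normal = rng.normal(0.0, 1.0, size=(genes, 30))
tumour[:20] += 2.0          # spike in 20 "differentially expressed" genes

t_stat, p_val = stats.ttest_ind(tumour, normal, axis=1)        # two-tailed
rejected, p_adj, _, _ = multipletests(p_val, alpha=0.05, method="fdr_bh")
print("genes passing FDR 5%:", int(rejected.sum()))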